Iterative Reweighted Linear Least Squares for Exact Penalty Subproblems on Product Sets
Authors
Abstract
We present two matrix-free methods for solving exact penalty subproblems on product sets that arise when solving large-scale optimization problems. The first is a novel iterative reweighting algorithm (IRWA), which iteratively minimizes quadratic models of relaxed subproblems while automatically updating a relaxation vector. The second is based on alternating direction augmented Lagrangian (ADAL) technology applied to our setting. The main computational cost of each algorithm is the repeated minimization of convex quadratic functions, which can be performed matrix-free. We prove that both algorithms are globally convergent under loose assumptions and that each requires at most O(1/ε²) iterations to reach ε-optimality of the objective function. Numerical experiments demonstrate the ability of both algorithms to efficiently find inexact solutions. Moreover, in certain cases, these experiments indicate that IRWA can be significantly more efficient than ADAL.
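The reweighting idea behind IRWA is easiest to see in its classic ancestor: iteratively reweighted least squares (IRLS) for an ℓ1 objective, where each iteration minimizes a weighted quadratic model and the weights are recomputed from the current residuals. The sketch below is only this classic IRLS on a toy robust-regression problem, not the paper's IRWA (which additionally handles product-set constraints and a relaxation vector); the function name `irls_l1` and the smoothing floor `eps` are illustrative choices.

```python
import numpy as np

def irls_l1(A, b, iters=50, eps=1e-8):
    """Toy IRLS for min_x ||Ax - b||_1 via reweighted quadratic models."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # start from ordinary least squares
    for _ in range(iters):
        r = A @ x - b
        # Reweight: |r_i| in the l1 objective is modeled by w_i * r_i^2
        # with w_i = 1/|r_i| (floored at eps to avoid division by zero).
        w = 1.0 / np.maximum(np.abs(r), eps)
        W = np.diag(w)
        # Minimize the quadratic model: solve (A^T W A) x = A^T W b.
        x = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true
b[:3] += 10.0                                  # a few gross outliers
x = irls_l1(A, b)                              # l1 fit ignores the outliers
```

Each subproblem is an unconstrained convex quadratic, which is what makes matrix-free (e.g. CG-based) implementations attractive at scale.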
Similar Papers
Global least squares solution of the matrix equation $\sum_{j=1}^{s} A_j X_j B_j = E$
In this paper, an iterative method is proposed for solving the matrix equation $\sum_{j=1}^{s} A_j X_j B_j = E$. This method is based on the global least squares (GL-LSQR) method for solving linear systems of equations with multiple right-hand sides. To apply the GL-LSQR algorithm to the above matrix equation, a new linear operator, its adjoint, and a new inner product are defined. It is p...
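The operator-based viewpoint described in this abstract can be sketched with standard tools: treat the stacked unknowns $(X_1,\dots,X_s)$ as one long vector and hand the linear map and its adjoint to an iterative least squares solver. The example below uses SciPy's LSQR on a `LinearOperator` as a stand-in; it mirrors the spirit of GL-LSQR (matrix-free operator plus adjoint), not the paper's actual algorithm or inner product.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

# Toy instance of sum_j A_j X_j B_j = E with s = 2 square blocks.
k, s = 4, 2
rng = np.random.default_rng(1)
A = [rng.standard_normal((k, k)) for _ in range(s)]
B = [rng.standard_normal((k, k)) for _ in range(s)]
X_true = [rng.standard_normal((k, k)) for _ in range(s)]
E = sum(A[j] @ X_true[j] @ B[j] for j in range(s))

def matvec(v):
    # Forward map: stacked vec(X_1),...,vec(X_s) -> vec(sum_j A_j X_j B_j).
    Xs = np.asarray(v).reshape(s, k, k)
    return sum(A[j] @ Xs[j] @ B[j] for j in range(s)).ravel()

def rmatvec(r):
    # Adjoint: <A X B, R> = <X, A^T R B^T>, blockwise.
    R = np.asarray(r).reshape(k, k)
    return np.stack([A[j].T @ R @ B[j].T for j in range(s)]).ravel()

op = LinearOperator((k * k, s * k * k), matvec=matvec, rmatvec=rmatvec)
sol = lsqr(op, E.ravel(), atol=1e-12, btol=1e-12, iter_lim=5000)[0]
residual = np.linalg.norm(matvec(sol) - E.ravel())
```

Because the system here is consistent, LSQR drives the residual to (numerical) zero while only ever touching the operator through `matvec`/`rmatvec`.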
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme, due to Mahdavi-Amiri and Bartels, for the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
Using an Efficient Penalty Method for Solving Linear Least Square Problem with Nonlinear Constraints
In this paper, we use a penalty method for solving the linear least squares problem with nonlinear constraints. In each iteration of a penalty method for solving the problem, the calculation of the projected Hessian matrix is required. Given that the objective function is linear least squares, the projected Hessian matrix of the penalty function consists of two parts, and the exact amount of a part of i...
Penalty Constraints and Kernelization of M-Estimation Based Fuzzy C-Means
A framework for an M-estimation based fuzzy C-means clustering (MFCM) algorithm is proposed using the iteratively reweighted least squares (IRLS) algorithm, and penalty-constraint and kernelization extensions of MFCM algorithms are also developed. By introducing penalty information into the objective functions of MFCM algorithms, the spatially constrained fuzzy c-means (SFCM) algorithm is extended to penalty constraints MF...
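The alternating update at the core of this family of methods is that of standard fuzzy c-means: fix memberships to update centers, then fix centers to update memberships. The sketch below is only plain FCM with fuzzifier m = 2 on synthetic data, the base algorithm that MFCM extends with M-estimator weights and penalty terms; the function `fcm` and all parameter values are illustrative.

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Minimal standard fuzzy c-means: alternate center and membership updates."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                       # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        # Centers: membership-weighted means of the data.
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # Memberships: inverse-distance update u_ij ∝ d_ij^(-2/(m-1)).
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
        d = np.maximum(d, 1e-12)
        U = d ** (-2.0 / (m - 1.0))
        U /= U.sum(axis=0)
    return centers, U

rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),   # blob near (0, 0)
               rng.normal(5.0, 0.1, (20, 2))])  # blob near (5, 5)
centers, U = fcm(X, c=2)
```

Replacing the squared distances in the membership update with an M-estimator loss, and adding a spatial penalty to the objective, is what turns this base loop into the MFCM/SFCM variants the abstract describes.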
Sparse Bayes estimation in non-Gaussian models via data augmentation
In this paper we provide a data-augmentation scheme that unifies many common sparse Bayes estimators into a single class. This leads to simple iterative algorithms for estimating the posterior mode under arbitrary combinations of likelihoods and priors within the class. The class itself is quite large: for example, it includes quantile regression, support vector machines, and logistic and multi...
Journal: SIAM Journal on Optimization
Volume: 25, Issue: -
Pages: -
Published: 2015